A parameterizable spatiotemporal representation of popular dance styles for humanoid dancing characters

Authors

  • João Lobato Oliveira
  • Luiz Naveda
  • Fabien Gouyon
  • Luís Paulo Reis
  • Paulo Sousa
  • Marc Leman
Abstract

Dance movements are a complex class of human behavior that conveys forms of non-verbal and subjective communication, performed as cultural vocabularies in all human cultures. The singularity of dance forms poses fascinating challenges for computer animation and robotics, which in turn present outstanding opportunities to deepen our understanding of the phenomenon of dance through the development of models, analyses and syntheses of motion patterns. In this article, we formalize a model for the analysis and representation of popular dance styles of repetitive gestures by specifying the parameters and validation procedures necessary to describe the spatiotemporal elements of the dance movement in relation to the temporal structure of its music (the musical meter). Our representation model precisely describes the structure of dance gestures according to the structure of the musical meter, at different temporal resolutions, and is flexible enough to convey the variability of the spatiotemporal relation between music structure and movement in space. It results in a compact, discrete mid-level representation of the dance that can be further applied to algorithms for the generation of movements in different humanoid dancing characters. The validation of our representation model relies upon two hypotheses: (i) the impact of the metric resolution and (ii) the impact of variability on fully and naturally representing a particular dance style of repetitive gestures. We assess these hypotheses numerically and subjectively by analyzing solo dance sequences of Afro-Brazilian samba and American Charleston, captured with a MoCap (motion capture) system. From these analyses, we build a set of dance representations modeled with different parameters and re-synthesize motion sequence variations of the represented dance styles. To specifically assess the metric hypothesis, we compare the captured dance sequences with repetitive sequences of a fixed dance motion pattern, synthesized at different metric resolutions for both dance styles. To evaluate the variability hypothesis, we compare the same repetitive sequences with others synthesized with variability, obtained by generating and concatenating stochastic variations of the represented dance pattern. The observed results validate the proposition that different dance styles of repetitive gestures may require a minimum, sufficient metric resolution to be fully represented by the proposed representation model. They also suggest, however, that additional information may be required to synthesize variability in the dance sequences while preserving the naturalness of the performance. Nevertheless, we found evidence supporting the use of the proposed dance representation for flexibly modeling and synthesizing dance sequences from different popular dance styles, with potential developments toward the generation of expressive and natural movement profiles for humanoid dancing characters.
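
The abstract describes a concrete mechanism: key poses of a repetitive dance gesture are anchored to discrete positions of the musical meter at a chosen metric resolution, and sequences are re-synthesized either as a fixed repetition of the represented pattern or by concatenating stochastic variations around it. The Python sketch below illustrates that idea under loose assumptions; the class name MetricDanceModel, the pose layout, and the Gaussian variability model are illustrative choices, not the authors' implementation.

    # Minimal sketch (illustrative, not the paper's implementation) of a
    # metric-anchored dance representation: key poses observed at discrete
    # metric positions are pooled across repetitions and re-synthesized
    # either as a fixed pattern or with stochastic variability.
    import numpy as np

    class MetricDanceModel:
        def __init__(self, resolution=8, n_joints=20):
            # resolution: metric subdivisions per musical bar
            # (e.g., 2 = half-bar, 4 = beat level, 8 = eighth-note level)
            self.resolution = resolution
            self.n_joints = n_joints
            # one pool of observed key poses per metric position
            self.key_poses = [[] for _ in range(resolution)]

        def add_cycle(self, poses_in_bar):
            # poses_in_bar: array (resolution, n_joints, 3) of joint positions
            # sampled at each metric subdivision of one annotated bar of MoCap.
            for k in range(self.resolution):
                self.key_poses[k].append(np.asarray(poses_in_bar[k], dtype=float))

        def pose_statistics(self, k):
            # Mean and standard deviation of the key pose at metric position k.
            observations = np.stack(self.key_poses[k])  # (n_cycles, n_joints, 3)
            return observations.mean(axis=0), observations.std(axis=0)

        def synthesize(self, n_bars, variability=0.0, rng=None):
            # variability = 0.0 repeats the fixed (mean) dance pattern;
            # variability > 0.0 concatenates stochastic variations drawn around
            # each key pose, scaled by its per-position standard deviation.
            rng = rng if rng is not None else np.random.default_rng()
            sequence = []
            for _ in range(n_bars):
                for k in range(self.resolution):
                    mean, std = self.pose_statistics(k)
                    noise = variability * std * rng.standard_normal(mean.shape)
                    sequence.append(mean + noise)
            return np.stack(sequence)  # (n_bars * resolution, n_joints, 3)

    # Hypothetical usage with placeholder data standing in for annotated MoCap bars.
    rng = np.random.default_rng(0)
    model = MetricDanceModel(resolution=8, n_joints=20)
    for _ in range(12):
        model.add_cycle(rng.standard_normal((8, 20, 3)))
    fixed = model.synthesize(n_bars=4, variability=0.0)   # fixed repeated pattern
    varied = model.synthesize(n_bars=4, variability=1.0)  # with variability

In a full pipeline, the discrete key poses would then be interpolated into continuous joint trajectories and retargeted to the humanoid character, along the lines of the retargeting and synthesis work listed under the similar articles below.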


Similar articles

Synthesis of Variable Dancing Styles Based on A Compact Spatiotemporal Representation of Dance

Dance, as a complex expressive form of motion, is able to convey emotion, meaning and social idiosyncrasies, opening channels for non-verbal communication and promoting rich cross-modal interactions with music and the environment. As such, realistic dancing characters may incorporate cross-modal information and the variability of dance forms through compact representations that may describe the ...


Humanized Robot Dancing: Humanoid Motion Retargeting Based in a Metrical Representation of Human Dance Styles

Expressiveness and naturalness in robotic motions and behaviors can be replicated by using captured human movements. Considering dance as a complex and expressive type of motion, in this paper we propose a method for generating humanoid dance motions transferred from human motion capture (MoCap) data. Motion data of samba dance was synchronized to samba music, manually annotated by exp...


Synthesis of Dance Performance Based on Analyses of Human Motion and Music

Recent progress in robotics holds great potential, and we are considering developing a dancing humanoid robot for entertainment. In this paper, we propose three fundamental methods for a dancing robot aimed at a sound feedback system in which a robot listens to music and automatically synthesizes dance motion based on the musical features. The first study analyzes the relationship betwe...


Empiric Evaluation of Robot Dancing Framework based on Multi-Modal Events

Musical robots have already inspired the creation of worldwide robotic dancing contests, such as RoboCup-Junior's Dance, where school teams formed by children aged eight to eighteen put their robots in action, performing dance to music in a display that emphasizes creativity of costumes and movement. This paper describes and assesses a framework for robot dancing edutainment applications. The prop...


Methods for Motion Generation and Interaction with a Humanoid Robot: Case Studies of Dancing and Catching

We focus on creating realistic, adaptable movement for humanoid robots and virtual characters. Here we present motion synthesis of dance movements for a humanoid robot, and interactive behavior for catching. Our approach to motion generation includes collection of example human movements, handling of marker occlusion, extraction of motion parameters, and trajectory generation, all of which must...



Journal:
  • EURASIP J. Audio, Speech and Music Processing

Volume 2012, Issue -

Pages -

Publication year: 2012